Web Survey Bibliography
Survey administrators go to great lengths to make survey questions easy to understand for a broad range of respondents. Despite these efforts, respondents do not always understand what the questions ask of them. In interviewer-administered surveys, interviewers can pick up on cues suggesting that a respondent does not understand or know how to answer a question and can provide assistance as their training allows. However, because interviewer administration is expensive, many surveys are moving toward other modes (at least for some respondents) that do not include costly interviewers, and with them a valuable source of clarification is lost. In Web surveys, researchers have experimented with providing real-time assistance to respondents who take a long time to answer a question. Help provided in this fashion has increased accuracy, but some respondents dislike the imposition of unsolicited help. There may be alternative ways to provide help that refine or overcome the limitations of using response times. This dissertation is organized into three studies, each using an independently collected data set, that identify indicators survey administrators can use to determine when a respondent is having difficulty answering a question, and that propose alternative ways of providing real-time assistance that increase accuracy as well as respondent satisfaction. The first study identifies nine movements respondents make with the mouse cursor while answering survey questions and, using exploratory analyses, hypothesizes which movements are related to difficulty. The second study confirms the use of these movements and applies hierarchical modeling to identify the four movements that are most predictive. The third study tests three different modes of providing unsolicited help to respondents: text box, audio recording, and chat. Accuracy and respondent satisfaction are evaluated for each mode.
There were no differences in accuracy across the three modes, but participants reported a preference for receiving help in a standard text box. These findings allow survey designers to identify difficult questions on a larger scale than previously possible and to increase accuracy by providing real-time assistance while maintaining respondent satisfaction.
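The general idea described in the abstract, flagging likely difficulty from response times and cursor behavior and then triggering real-time help, can be sketched as follows. This is a minimal illustrative sketch, not the dissertation's actual classifier: the function names, the direction-reversal feature, and all thresholds are hypothetical.

```python
# Illustrative sketch: flag likely respondent difficulty from simple
# interaction features. Feature names and thresholds are hypothetical,
# not taken from the dissertation.

def difficulty_score(response_time_s, cursor_trace):
    """Return a crude difficulty score from response time and cursor path.

    cursor_trace: list of (x, y) cursor positions sampled while answering.
    """
    score = 0.0
    # Long response times are the established indicator mentioned above.
    if response_time_s > 30:  # hypothetical threshold, in seconds
        score += 1.0
    # Count horizontal direction reversals as a stand-in for "regressive"
    # cursor movements (e.g., moving back and forth over the question).
    reversals = 0
    for (x0, _), (x1, _), (x2, _) in zip(cursor_trace,
                                         cursor_trace[1:],
                                         cursor_trace[2:]):
        if (x1 - x0) * (x2 - x1) < 0:
            reversals += 1
    if reversals >= 3:  # hypothetical threshold
        score += 1.0
    return score

def should_offer_help(response_time_s, cursor_trace, threshold=1.0):
    """Trigger real-time help once the score reaches the threshold."""
    return difficulty_score(response_time_s, cursor_trace) >= threshold
```

In a deployed Web survey, the cursor trace would come from client-side movement events and the help itself could be delivered in any of the three modes the third study compares (text box, audio recording, or chat).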
Digital Repository at the University of Maryland (abstract) / (full text)
Web survey bibliography - Thesis, diplomas (29)
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Designing web surveys for the multi-device internet; 2015; de Bruijne, M.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- Mixed-method feasibility study comparing the outpatient assessment of burn patients using a tablet device...; 2015; Mitchell, S. S.
- Facebook, Twitter, & QR Codes: An Exploratory Trial Examining the Feasibility of Social Media Mechanisms...; 2014; Gu, L. L.
- Open-ended questions in Web Surveys - Using visual and adaptive questionnaire design to improve narrative...; 2014; Emde, M.
- Design and Implementation of an Online Questionnaire Tool; 2014; Schaniel, R.
- User Modeling via Machine Learning and Rule-Based Reasoning to Understand and Predict Errors in Survey...; 2013; Stuart, L. C.
- Investigation of background acoustical effect on online surveys: A case study of a farmers' market...; 2013; Tang, Xi.
- Developing a New Mixed-Mode Methodology For a Provincial Park Camper Survey in British Columbia; 2013; Dyck, B. W.
- Classifying Mouse Movements and Providing Help in Web Surveys; 2013; Horwitz, R.
- Satisficing in Web Surveys: Implications for Data Quality and Strategies for Reduction; 2013; Zhang, Che.
- “I think I know what you did last summer” Improving data quality in panel surveys; 2012; Lugtig, P. J.
- Analyzing Functionalities for Online Questionnaire System (OQS); 2012; Atown, H. Y.
- Web panels in Slovenia; 2011; Lenar, J.
- Clarifying Survey Questions; 2011; Redline, C. D.
- Nonresponse and Measurement Error in Mobile Phone Surveys; 2010; Kennedy, C.
- Internet-Based Measurement With Visual Analogue Scales: An Experimental Investigation; 2010; Funke, F.
- Social Networking Sites: Evaluating and Investigating their use in Academic Research; 2010; Redmond, F.
- E-epidemiology : Adapting epidemiological methods for the 21st century; 2009; Bexelius, C.
- Visual Design Effects on Respondents’ Behavior in Web-Surveys; 2009; Greinoecker, A.
- Improving survey response in mail and internet general public surveys using address-based sampling and...; 2009; Messer, B. L.
- Design Variations in Adaptive Web Sampling; 2008; Vincent, K. S.
- Internet-based survey design for university web sites: a case study of a Thai university; 2007; Vate-U-Lan, P.
- On the cost-efficiency of probability sampling based mail surveys with a Web response option; 2005; Werner, P.
- Cognitive Laboratory Experiences : On Pre-testing Computerised Questionnaires; 2002; Snijkers, G.
- (Non)Response bei Web-Befragungen [(Non)response in Web surveys]; 2002; Bosnjak, M.
- Web survey errors; 2001; Lozar Manfreda, K.
- A study of factors affecting responses in electronic mail surveys; 1997; Good, K. P.